
    Population Firm Interaction and the Dynamics of Assimilation Gap

    The paper shows that the interaction between population-level and firm-level knowledge produces a non-monotonic change in the assimilation gap. The assimilation gap traces a peaked curve, with an upward slope driven by imitation and a downward slope driven by knowledge spillovers. Changes in the characteristics of the innovation shift the peak over time: relative advantage and compatibility shift the peak earlier, while complexity shifts it later. The model is tested in a simulated environment and offers insights into the differences in the temporal trajectories of the various adopter groups.
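    The mechanism can be made concrete with a toy simulation (our own minimal sketch, not the paper's actual model; the parameters q, s and the 0.05 spillover seed are assumptions): acquisition spreads by imitation, assimilation catches up through knowledge spillovers, and the gap between the two rises and then falls.

        # Toy illustration (not the paper's model): acquisition spreads by
        # imitation, assimilation catches up via spillovers, so the
        # assimilation gap rises and then falls over time.
        import numpy as np

        T = 200
        acquired = np.zeros(T)      # fraction of population that has acquired
        assimilated = np.zeros(T)   # fraction that has fully assimilated
        acquired[0] = 0.01
        q = 0.5   # imitation strength (assumed value)
        s = 0.25  # spillover strength (assumed value)

        for t in range(1, T):
            # imitation: adoption grows with contact between adopters and non-adopters
            acquired[t] = acquired[t-1] + q * acquired[t-1] * (1 - acquired[t-1])
            # spillovers: assimilation hazard grows with the stock of
            # already-assimilated knowledge (0.05 seeds the process)
            spill = s * (0.05 + assimilated[t-1])
            assimilated[t] = assimilated[t-1] + spill * (acquired[t-1] - assimilated[t-1])

        gap = acquired - assimilated
        print("peak gap %.3f at t=%d" % (gap.max(), gap.argmax()))

    In this toy version, raising q (a stand-in for relative advantage and compatibility) moves the peak earlier, while lowering s (a stand-in for complexity slowing assimilation) moves it later, mirroring the shifts described in the abstract.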

    Microarray-based ultra-high resolution discovery of genomic deletion mutations

    BACKGROUND: Oligonucleotide microarray-based comparative genomic hybridization (CGH) offers an attractive route for the rapid and cost-effective genome-wide discovery of deletion mutations. CGH typically involves comparing the hybridization intensities of genomic DNA samples with microarray chip representations of entire genomes, and has widespread potential application in experimental research and medical diagnostics. However, the power to detect small deletions is low. RESULTS: Here we use a graduated series of Arabidopsis thaliana genomic deletion mutations (ranging in size from 4 bp to ~5 kb) to optimize CGH-based genomic deletion detection. We show that the power to detect smaller deletions (4, 28 and 104 bp) depends upon oligonucleotide density (essentially the number of genome-representative oligonucleotides on the microarray chip), and we determine the oligonucleotide spacings necessary to guarantee detection of deletions of a specified size. CONCLUSIONS: Our findings will enhance a wide range of research and clinical applications, and in particular will aid the discovery of genomic deletions in the absence of a priori knowledge of their existence.
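    The spacing requirement admits a back-of-envelope calculation (our own derivation under the assumption of evenly spaced 25-mer probes, not the paper's exact analysis): a probe overlaps a deletion of length d whenever its start falls in a window of d + 24 positions, so at a start-to-start spacing s the worst-case number of overlapping probes is floor((d + 24) / s).

        # Back-of-envelope sketch (our assumptions, not the paper's model):
        # a 25-mer probe overlaps a deletion of length d iff its start lies
        # in a window of d + 24 positions, so with probes tiled every
        # `spacing` bp the worst case is floor((d + 24) / spacing) probes.

        PROBE_LEN = 25

        def min_overlapping_probes(deletion_len, spacing):
            """Guaranteed (worst-case) number of probes overlapping a deletion."""
            return (deletion_len + PROBE_LEN - 1) // spacing

        def max_spacing_for(deletion_len, probes_needed):
            """Largest spacing that still guarantees `probes_needed` probes."""
            return (deletion_len + PROBE_LEN - 1) // probes_needed

        for d in (4, 28, 104):  # the deletion sizes studied in the paper
            print(d, min_overlapping_probes(d, spacing=10), max_spacing_for(d, 3))

    For example, guaranteeing that a 4 bp deletion hits at least three probes requires a spacing of at most 9 bp under these assumptions.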

    A Decision Support System for Moving Workloads to Public Clouds

    The current economic environment is compelling CxOs to look for better IT resource utilization in order to get more value from their IT investments and reuse existing infrastructure to support growing business demands. How to get more from less? How to reuse the resources? How to minimize the Total Cost of Ownership (TCO) of underlying IT infrastructure and data center operation cost? How to improve Return On Investment (ROI) to remain profitable and transform the IT cost center into a profit center? All of these questions are now being considered in light of emerging ‘Public Cloud Computing’ services. Cloud Computing is a model for enabling resource allocation to dynamic business workloads in real time from a pool of free resources in a cost-effective manner. Providing resources on demand at cost-effective pricing is not the only criterion when determining if a business service workload can be moved to a public cloud. So what else must CxOs consider before they migrate to public cloud environments? There is a need to validate the business applications and workloads in terms of technical portability and business requirements/compliance so that they can be deployed into a public cloud without considerable customization. This validation is not a simple task.

    In this paper, we discuss an approach and the analytic tooling that help CxOs and their teams automate the process of identifying business workloads that should move to a public cloud environment, as well as understanding the cost benefits. Using this approach, an organization can identify the most suitable business service workloads that could be moved to a public cloud environment from a private data center without re-architecting the applications or changing their business logic. The approach helps automate the classification and categorization of workloads into various categories. For example, Business Critical (BC) and Non-Business Critical (NBC) workloads can be identified based on the role of business services within the overall business function. The approach supports the assessment of public cloud providers on the basis of features and constraints, and takes into consideration industry compliance and the price model for hosting workloads on a pay-per-use basis. Finally, the inbuilt analytics in the tool find the ‘best-fit’ cloud provider for hosting the business service workload, where ‘best fit’ is based on the analysis and outcomes of the previously mentioned steps.

    Today, the industry follows a manual, time-consuming process for workload identification, workload classification and cloud provider assessment to find the best fit for business service workload hosting. The suggested automated approach enables an organization to reduce cost and time when deciding to move to a public cloud environment, and accelerates the entire process of leveraging cloud benefits through an effective, informed, fact-based decision process.
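    As a rough illustration of the ‘best-fit’ step (a minimal sketch with invented field names and scoring rules, not the paper's actual tool), provider selection can be framed as filtering on hard constraints such as compliance and required features, then ranking the survivors on a soft criterion such as pay-per-use price:

        # Minimal sketch of "best-fit" provider matching.  All field names,
        # constraints and the scoring rule are illustrative assumptions.

        def best_fit(workload, providers):
            """Filter providers on hard constraints, then rank by score."""
            candidates = []
            for p in providers:
                # hard constraints: required compliance and technical features
                if not workload["compliance"].issubset(p["compliance"]):
                    continue
                if not workload["features"].issubset(p["features"]):
                    continue
                # soft criterion: cheaper pay-per-use pricing scores higher
                score = 1.0 / p["price_per_hour"]
                candidates.append((score, p["name"]))
            return max(candidates, default=None)

        workload = {"compliance": {"PCI-DSS"}, "features": {"autoscaling"}}
        providers = [
            {"name": "A", "compliance": {"PCI-DSS", "ISO27001"},
             "features": {"autoscaling"}, "price_per_hour": 0.12},
            {"name": "B", "compliance": {"ISO27001"},
             "features": {"autoscaling"}, "price_per_hour": 0.08},
        ]
        # provider B is cheaper but fails the compliance constraint -> A wins
        print(best_fit(workload, providers))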

    An Alternative String Landscape Cosmology: Eliminating Bizarreness

    In what has become the standard eternal-inflation picture of the string landscape there are many problematic consequences and a difficulty in defining probabilities for the occurrence of each type of universe. One feature in particular that might be philosophically disconcerting is the infinite cloning of each individual and each civilization in infinite numbers of separated regions of the multiverse. Even if this is not ruled out due to causal separation, one should ask whether the infinite cloning is a universal prediction of string landscape models or whether there are scenarios in which it is avoided. If a viable alternative cosmology can be constructed, one might search for predictions that allow one to discriminate experimentally between the models. We present one such scenario although, in doing so, we are forced to give up several popular presuppositions, including the absence of a preferred frame and the homogeneity of matter in the universe. The model also has several ancillary advantages. We also consider the future lifetime of the current universe before it becomes a light-trapping region. Comment: 13 pages, 1 figure, minor clarifications in version

    Potential pitfalls in MitoChip detected tumor-specific somatic mutations: a call for caution when interpreting patient data

    Background: Several investigators have employed the high-throughput mitochondrial sequencing array (MitoChip) in clinical studies to search mtDNA for markers linked to cancers. In consequence, a host of somatic mtDNA mutations have been identified as linked to different types of cancer. However, closer examination of these data shows a number of potential pitfalls in the detection of tumor-specific somatic mutations in clinical case studies, urging caution in the interpretation of patient mtDNA data. This study examined mitochondrial sequence variants reported in cancer patients and assessed the reliability of using detected patterns of polymorphisms in the early diagnosis of cancer. Methods: Published entire mitochondrial genomes from head and neck, adenoid cystic carcinoma, sessile serrated adenoma, and lung primary tumors of clinical patients were examined in a phylogenetic context and compared with known, naturally occurring mutations that characterize different populations. Results: The phylogenetic linkage analysis of whole arrays of mtDNA mutations from patient cancerous and non-cancerous tissue confirmed that artificial recombination events occurred in studies of head and neck, adenoid cystic carcinoma, sessile serrated adenoma, and lung primary tumors. Our phylogenetic analysis of these tumor and control leukocyte mtDNA haplotype sequences shows clear-cut evidence of mixed ancestries found in single individuals. Conclusions: Our study makes two prescriptions, both for the clinical situation and for research: 1. more care should be taken in maintaining sample identity, and 2. analysis should always be undertaken with respect to all the data available and within an evolutionary framework to eliminate artifacts and mix-ups.
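    The phylogenetic sanity check described above can be sketched in a few lines (an illustration of the idea only; the haplogroup names and diagnostic variants below are hypothetical placeholders, not real haplogroup definitions): if the variants observed in one patient's sample pair match diagnostic markers of two different mtDNA haplogroups, a sample mix-up or artificial recombination should be suspected.

        # Illustrative mixed-ancestry check.  The haplogroups and marker
        # variants are hypothetical placeholders, not real definitions.

        HAPLOGROUP_MARKERS = {
            "HG1": {"A263G", "C750T", "A1438G"},   # assumed diagnostic variants
            "HG2": {"T489C", "G8701A", "C9540T"},
        }

        def ancestry_check(variants):
            """Return haplogroups whose diagnostic markers the sample carries."""
            hits = {hg for hg, markers in HAPLOGROUP_MARKERS.items()
                    if len(variants & markers) >= 2}
            if len(hits) > 1:
                print("WARNING: mixed ancestry in one sample pair:", sorted(hits))
            return hits

        # tumor/control pair whose combined variants straddle two haplogroups
        ancestry_check({"A263G", "C750T", "T489C", "G8701A"})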

    Changing Pattern of Human Listeriosis, England and Wales, 2001–2004

    Disease has reemerged, mainly in patients ≥60 years of age with bacteremia.

    A Bayesian Approach to the Evolution of Metabolic Networks on a Phylogeny

    The availability of genomes from many closely related bacteria with diverse metabolic capabilities makes it possible to trace metabolic evolution on a phylogeny relating those genomes, and thereby to understand the evolutionary processes and constraints that shape metabolic networks. Using either simple (independent loss/gain of reactions) or complex (incorporating dependencies among reactions) stochastic models of metabolic evolution, one can study how metabolic networks evolve over time. Here, we describe a model that takes the reaction neighborhood into account when modeling metabolic evolution; the model also allows estimation of the strength of the neighborhood effect during the course of evolution. We present Gibbs samplers for sampling networks at the internal nodes of a phylogeny and for estimating the parameters of evolution over a phylogeny without exploring the whole search space, by iteratively sampling from the conditional distributions of the internal networks and the parameters. The samplers are used to estimate the parameters of evolution of metabolic networks of bacteria in the genus Pseudomonas and to infer the metabolic networks of the ancestral pseudomonads. The results suggest that pathway maps that are conserved across the Pseudomonas phylogeny have a stronger neighborhood structure than those with a variable distribution of reactions across the phylogeny, and that some Pseudomonas lineages are undergoing genome reduction, resulting in the loss of a number of reactions from their metabolic networks.
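    The alternating structure of such a sampler can be shown with a deliberately stripped-down example (our own sketch, far simpler than the paper's model: one ancestral network, several descendant networks, an independent loss/gain model with a single per-branch flip probability theta and a Beta(1,1) prior, and no phylogeny or neighborhood effect):

        # Simplified Gibbs sampler: alternately (1) sample each ancestral
        # reaction given the observed networks and theta, and (2) sample
        # theta given the mismatch counts (Beta prior is conjugate here).
        import random

        leaves = [  # presence/absence of 6 reactions in 3 observed networks
            [1, 1, 0, 1, 0, 1],
            [1, 0, 0, 1, 0, 1],
            [1, 1, 1, 1, 0, 0],
        ]
        n_reactions = len(leaves[0])
        ancestor = [random.randint(0, 1) for _ in range(n_reactions)]
        theta = 0.3  # initial per-branch flip probability

        for _ in range(2000):
            # 1) sample each ancestral reaction from its conditional distribution
            for r in range(n_reactions):
                w = {}
                for state in (0, 1):
                    mism = sum(leaf[r] != state for leaf in leaves)
                    w[state] = (theta ** mism) * ((1 - theta) ** (len(leaves) - mism))
                ancestor[r] = 1 if random.random() < w[1] / (w[0] + w[1]) else 0
            # 2) sample theta given mismatch counts (Beta(1,1) prior)
            mism = sum(leaf[r] != ancestor[r]
                       for leaf in leaves for r in range(n_reactions))
            total = len(leaves) * n_reactions
            theta = random.betavariate(1 + mism, 1 + total - mism)

        print("posterior sample: theta=%.2f ancestor=%s" % (theta, ancestor))

    The paper's samplers extend this idea to a full phylogeny and add the neighborhood effect, which makes the conditional distribution of each reaction depend on the states of neighboring reactions as well.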

    Acute myocardial infarction: profile and management at a tertiary care hospital in Karachi

    Objective: Acute myocardial infarction (AMI) is a rising epidemic in developing countries. While studies in the West have established the characteristics and management of AMI patients, comprehensive data reflecting these issues in Pakistani subjects are scarce. This study examined the profile and management of AMI in patients hospitalized at a tertiary care hospital in Karachi, Pakistan. Methods: Three hundred forty-four patients admitted in 1998 with the diagnosis of AMI met our inclusion criteria. Data on presentation, investigations, monitoring and therapy were obtained. Chi-square and t tests were used to analyze the data. Results: Of the 344 patients with AMI, 71% were male and 58% had a Q-wave MI. The majority of the patients who presented within 2 hours of symptom onset (36%) had chest pain. Patients with dyspnea and no chest pain were more likely to present more than 12 hours after the onset of symptoms. In-hospital mortality was 10.8%. Low HDL and diabetes were associated with in-hospital complications. Twenty-nine percent of patients were given thrombolytic therapy, with a mean door-to-needle time of 1 hour 36 minutes; 33% of patients who were eligible for streptokinase did not receive it. Cardiac catheterization was performed in 28% of patients. Echocardiography and exercise tolerance testing, both underutilized, were performed in 67% and 16% of patients, respectively. Two hundred sixteen (70%) of the patients discharged from hospital were contacted via telephone, and the 1-year mortality rate among them was 28%. Conclusion: The profile and management of AMI were in line with earlier Western studies. Chest pain units need to be established in the emergency room. Patients should be risk-stratified prior to discharge. Public awareness regarding primary and secondary prevention and the symptoms of AMI needs to be increased.

    ReseqChip: Automated integration of multiple local context probe data from the MitoChip array in mitochondrial DNA sequence assembly

    Background: The Affymetrix MitoChip v2.0 is an oligonucleotide tiling array for the resequencing of the human mitochondrial (mt) genome. For each of the 16,569 nucleotide positions of the mt genome it holds two sets of four 25-mer probes each, which match the heavy and the light strand of a reference mt genome and vary only at their central position so as to interrogate all four possible alleles. In addition, the MitoChip v2.0 carries alternative local-context probes to account for known mtDNA variants. These probes have been neglected in most studies due to the lack of software for their automated analysis. Results: We provide ReseqChip, a free software tool that automates the process of resequencing mtDNA using the multiple local-context probes on the MitoChip v2.0. ReseqChip significantly improves the base call rate and sequence accuracy. ReseqChip is available at http://code.open-bio.org/svnweb/index.cgi/bioperl/browse/bioperl-live/trunk/Bio/Microarray/Tools/. Conclusions: ReseqChip allows for the automated consolidation of base calls from alternative local mt genome context probes. It thereby improves the accuracy of resequencing while reducing the number of non-called bases.
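    The consolidation idea can be sketched as follows (this is not ReseqChip's actual algorithm, only an illustration of pooling redundant calls; the quality threshold and scoring are assumptions): each alternative probe set proposes a base call with a quality score, agreeing calls pool their support, and a position stays an 'N' (no-call) only if no base is supported well enough.

        # Hedged sketch of consolidating base calls from alternative
        # local-context probe sets (illustrative, not ReseqChip's method).
        from collections import defaultdict

        def consolidate(calls, min_quality=20.0):
            """calls: list of (base, quality) pairs from alternative probe sets."""
            support = defaultdict(float)
            for base, quality in calls:
                if base != "N":          # ignore no-calls from individual sets
                    support[base] += quality
            if not support:
                return "N"
            best = max(support, key=support.get)
            return best if support[best] >= min_quality else "N"

        # a position the reference probes leave uncalled, but which two
        # alternative local-context probe sets resolve consistently
        print(consolidate([("N", 0.0), ("G", 31.5), ("G", 12.2)]))  # -> G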